Face Generation

In this project, you'll use generative adversarial networks to generate new images of faces.

Get the Data

You'll be using two datasets in this project:

  • MNIST
  • CelebA

Since the CelebA dataset is complex and this is your first project with GANs, we want you to test your neural network on MNIST before CelebA. Running the GAN on MNIST will let you see how well your model trains sooner.

If you're using FloydHub, set data_dir to "/input" and use the FloydHub data ID "R5KrjnANiKVhLWAkpXhNBe".

In [1]:
data_dir = './data'

# FloydHub - Use with data ID "R5KrjnANiKVhLWAkpXhNBe"
#data_dir = '/input'


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import helper

helper.download_extract('mnist', data_dir)
helper.download_extract('celeba', data_dir)
Found mnist Data
Found celeba Data

Explore the Data

MNIST

As you're aware, the MNIST dataset contains images of handwritten digits. You can change the number of examples shown by changing show_n_images.

In [2]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
%matplotlib inline
import os
from glob import glob
from matplotlib import pyplot

mnist_images = helper.get_batch(glob(os.path.join(data_dir, 'mnist/*.jpg'))[:show_n_images], 28, 28, 'L')
pyplot.imshow(helper.images_square_grid(mnist_images, 'L'), cmap='gray')
Out[2]:
<matplotlib.image.AxesImage at 0x10c0419b0>

CelebA

The CelebFaces Attributes Dataset (CelebA) contains over 200,000 celebrity images with annotations. Since you're going to be generating faces, you won't need the annotations. You can change the number of examples shown by changing show_n_images.

In [3]:
show_n_images = 25

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
celeba_images = helper.get_batch(glob(os.path.join(data_dir, 'img_align_celeba/*.jpg'))[:show_n_images], 28, 28, 'RGB')
pyplot.imshow(helper.images_square_grid(celeba_images, 'RGB'))
Out[3]:
<matplotlib.image.AxesImage at 0x10f97fb00>

Preprocess the Data

Since the project's main focus is on building the GAN, we'll preprocess the data for you. The MNIST and CelebA images will be 28x28, with pixel values in the range of -0.5 to 0.5. The CelebA images will be cropped to remove parts of the image that don't include a face, then resized down to 28x28.

The MNIST images are grayscale with a single color channel, while the CelebA images have three color channels (RGB).
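The scaling step is simple to sketch in NumPy (scale_images is a hypothetical name for illustration; the project's helper module does this for you):

```python
import numpy as np

def scale_images(images):
    """Scale uint8 pixel values in [0, 255] into the range [-0.5, 0.5]."""
    return images.astype(np.float32) / 255.0 - 0.5

batch = np.array([[0, 128, 255]], dtype=np.uint8)
scaled = scale_images(batch)  # endpoints map to -0.5 and 0.5
```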

Build the Neural Network

You'll build the components necessary for a GAN by implementing the following functions:

  • model_inputs
  • discriminator
  • generator
  • model_loss
  • model_opt
  • train

Check the Version of TensorFlow and Access to GPU

This will check that you have the correct version of TensorFlow and access to a GPU.

In [4]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
from distutils.version import LooseVersion
import warnings
import tensorflow as tf

# Check TensorFlow Version
assert LooseVersion(tf.__version__) >= LooseVersion('1.0'), 'Please use TensorFlow version 1.0 or newer.  You are using {}'.format(tf.__version__)
print('TensorFlow Version: {}'.format(tf.__version__))

# Check for a GPU
if not tf.test.gpu_device_name():
    warnings.warn('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Default GPU Device: {}'.format(tf.test.gpu_device_name()))
TensorFlow Version: 1.1.0
/Users/Ken/anaconda2/envs/deep2/lib/python3.6/site-packages/ipykernel_launcher.py:14: UserWarning: No GPU found. Please use a GPU to train your neural network.
  

Input

Implement the model_inputs function to create TF Placeholders for the Neural Network. It should create the following placeholders:

  • Real input images placeholder with rank 4 using image_width, image_height, and image_channels.
  • Z input placeholder with rank 2 using z_dim.
  • Learning rate placeholder with rank 0.

Return the placeholders in the following tuple: (tensor of real input images, tensor of z data, learning rate).

In [5]:
import problem_unittests as tests

def model_inputs(image_width, image_height, image_channels, z_dim):
    """
    Create the model inputs
    :param image_width: The input image width
    :param image_height: The input image height
    :param image_channels: The number of image channels
    :param z_dim: The dimension of Z
    :return: Tuple of (tensor of real input images, tensor of z data, learning rate)
    """
    # TODO: Implement Function
    input_images = tf.placeholder(tf.float32, (None, image_width, image_height, image_channels), name='input_images') 
    z_data = tf.placeholder(tf.float32, (None, z_dim), name='z_data') 
    learning_rate = tf.placeholder(tf.float32, name='learning_rate') 
    return (input_images, z_data, learning_rate)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_inputs(model_inputs)
Tests Passed

Discriminator

Implement discriminator to create a discriminator neural network that discriminates on images. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "discriminator" to allow the variables to be reused. The function should return a tuple of (tensor output of the discriminator, tensor logits of the discriminator).

In [12]:
def discriminator(images, reuse=False):
    """
    Create the discriminator network
    :param images: Tensor of input image(s)
    :param reuse: Boolean if the weights should be reused
    :return: Tuple of (tensor output of the discriminator, tensor logits of the discriminator)
    """
    # TODO: Implement Function
    with tf.variable_scope('discriminator', reuse=reuse):
        
        alpha = 0.2
        x1 = tf.layers.conv2d(images, 64, 5, strides=2, padding='same', 
                              kernel_initializer=tf.contrib.layers.xavier_initializer(seed=2))
        relu1 = tf.maximum(alpha * x1, x1)
        
        x2 = tf.layers.conv2d(relu1, 128, 5, strides=2, padding='same', 
                              kernel_initializer=tf.contrib.layers.xavier_initializer(seed=2))
        bn2 = tf.layers.batch_normalization(x2, training=True)
        relu2 = tf.maximum(alpha * bn2, bn2)
        
        x3 = tf.layers.conv2d(relu2, 256, 5, strides=2, padding='same', 
                              kernel_initializer=tf.contrib.layers.xavier_initializer(seed=2))
        bn3 = tf.layers.batch_normalization(x3, training=True)
        relu3 = tf.maximum(alpha * bn3, bn3)
        
        flat = tf.reshape(relu3, (-1, 4*4*256))
        logits = tf.layers.dense(flat, 1)
        out = tf.sigmoid(logits)
    return (out, logits)


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_discriminator(discriminator, tf)
Tests Passed
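A quick way to sanity-check the `flat = tf.reshape(relu3, (-1, 4*4*256))` line: with 'same' padding, each stride-2 convolution halves the spatial size, rounding up. A sketch (same_conv_out is a hypothetical helper, not part of the project):

```python
import math

def same_conv_out(size, stride):
    # With 'same' padding, output size = ceil(input_size / stride),
    # independent of the kernel size.
    return math.ceil(size / stride)

sizes = [28]
for _ in range(3):  # three stride-2 convolutions
    sizes.append(same_conv_out(sizes[-1], 2))
# 28 -> 14 -> 7 -> 4, so the flattened feature size is 4*4*256
```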

Generator

Implement generator to generate an image using z. This function should be able to reuse the variables in the neural network. Use tf.variable_scope with a scope name of "generator" to allow the variables to be reused. The function should return the generated 28 x 28 x out_channel_dim images.

In [13]:
def generator(z, out_channel_dim, is_train=True):
    """
    Create the generator network
    :param z: Input z
    :param out_channel_dim: The number of channels in the output image
    :param is_train: Boolean if generator is being used for training
    :return: The tensor output of the generator
    """
    # TODO: Implement Function
    with tf.variable_scope('generator', reuse=not is_train):
        alpha = 0.2
    
        x1 = tf.layers.dense(z, 2*2*512)
        x1 = tf.reshape(x1, (-1, 2, 2, 512))
        x1 = tf.layers.batch_normalization(x1, training=is_train)
        x1 = tf.maximum(alpha * x1, x1)
    
        x2 = tf.layers.conv2d_transpose(x1, 256, 5, strides=2, padding='valid')
        x2 = tf.layers.batch_normalization(x2, training=is_train)
        x2 = tf.maximum(alpha * x2, x2)
    
        x3 = tf.layers.conv2d_transpose(x2, 128, 5, strides=2, padding='same')
        x3 = tf.layers.batch_normalization(x3, training=is_train)
        x3 = tf.maximum(alpha * x3, x3)
    
        logits = tf.layers.conv2d_transpose(x3, out_channel_dim, 5, strides=2, padding='same')
        out = tf.tanh(logits)
    
        return out

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_generator(generator, tf)
Tests Passed
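The generator's shape progression can be checked the same way. For tf.layers.conv2d_transpose, 'same' padding multiplies the size by the stride, while 'valid' padding gives (size - 1) * stride + kernel, which is why the first layer uses 'valid' to get from 2x2 to 7x7 (deconv_out is a hypothetical helper for illustration):

```python
def deconv_out(size, kernel, stride, padding):
    # Output size of a transposed convolution for the two padding modes.
    if padding == 'same':
        return size * stride
    return (size - 1) * stride + kernel  # 'valid'

h = 2                             # after reshaping the dense layer to 2x2x512
h = deconv_out(h, 5, 2, 'valid')  # 7
h = deconv_out(h, 5, 2, 'same')   # 14
h = deconv_out(h, 5, 2, 'same')   # 28, matching the required output size
```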

Loss

Implement model_loss to build the GAN for training and calculate the loss. The function should return a tuple of (discriminator loss, generator loss). Use the following functions you implemented:

  • discriminator(images, reuse=False)
  • generator(z, out_channel_dim, is_train=True)
In [15]:
def model_loss(input_real, input_z, out_channel_dim):
    """
    Get the loss for the discriminator and generator
    :param input_real: Images from the real dataset
    :param input_z: Z input
    :param out_channel_dim: The number of channels in the output image
    :return: A tuple of (discriminator loss, generator loss)
    """
    # TODO: Implement Function
    g_model = generator(input_z, out_channel_dim)
    d_model_real, d_logits_real = discriminator(input_real)
    d_model_fake, d_logits_fake = discriminator(g_model, reuse=True)
    
    d_loss_real = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_real, labels=tf.ones_like(d_model_real)*0.9))
    d_loss_fake = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.zeros_like(d_model_fake)))
    g_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_logits_fake, labels=tf.ones_like(d_model_fake)))
    d_loss = d_loss_real + d_loss_fake
    return d_loss, g_loss


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_loss(model_loss)
Tests Passed
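tf.nn.sigmoid_cross_entropy_with_logits computes the cross-entropy in the numerically stable form max(x, 0) - x*z + log(1 + exp(-|x|)), which avoids overflow for large |x|. A NumPy sketch of the real-image term with the one-sided label smoothing used above (sigmoid_xent is a hypothetical name):

```python
import numpy as np

def sigmoid_xent(logits, labels):
    # Stable form of -z*log(sigmoid(x)) - (1-z)*log(1-sigmoid(x)):
    # max(x, 0) - x*z + log(1 + exp(-|x|))
    return np.maximum(logits, 0) - logits * labels + np.log1p(np.exp(-np.abs(logits)))

logits = np.array([2.0, -1.0])
real_labels = np.ones_like(logits) * 0.9  # one-sided label smoothing
d_loss_real = sigmoid_xent(logits, real_labels).mean()
```

Smoothing the real labels from 1.0 to 0.9 keeps the discriminator from becoming overconfident, which tends to stabilize GAN training.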

Optimization

Implement model_opt to create the optimization operations for the GAN. Use tf.trainable_variables to get all the trainable variables. Filter the variables with names that are in the discriminator and generator scope names. The function should return a tuple of (discriminator training operation, generator training operation).

In [16]:
def model_opt(d_loss, g_loss, learning_rate, beta1):
    """
    Get optimization operations
    :param d_loss: Discriminator loss Tensor
    :param g_loss: Generator loss Tensor
    :param learning_rate: Learning Rate Placeholder
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :return: A tuple of (discriminator training operation, generator training operation)
    """
    # TODO: Implement Function
    t_vars = tf.trainable_variables()
    d_vars = [var for var in t_vars if var.name.startswith('discriminator')]
    g_vars = [var for var in t_vars if var.name.startswith('generator')]
    # Wrap the optimizers in a dependency on UPDATE_OPS so the batch
    # normalization moving statistics are updated with each training step.
    with tf.control_dependencies(tf.get_collection(tf.GraphKeys.UPDATE_OPS)):
        d_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(d_loss, var_list=d_vars)
        g_train_opt = tf.train.AdamOptimizer(learning_rate, beta1=beta1).minimize(g_loss, var_list=g_vars)
    return d_train_opt, g_train_opt


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_model_opt(model_opt, tf)
Tests Passed
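The scope filtering works because every variable created inside tf.variable_scope('discriminator') or tf.variable_scope('generator') gets a name prefixed with its scope. A toy sketch with hypothetical variable names mimicking tf.trainable_variables() output:

```python
# Hypothetical variable names, as tf.variable_scope would produce them.
var_names = [
    'discriminator/conv2d/kernel:0',
    'discriminator/dense/bias:0',
    'generator/dense/kernel:0',
    'generator/conv2d_transpose/kernel:0',
]
# Same prefix filtering as in model_opt above.
d_vars = [v for v in var_names if v.startswith('discriminator')]
g_vars = [v for v in var_names if v.startswith('generator')]
```

This split is what lets each optimizer update only its own network's weights.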

Neural Network Training

Show Output

Use this function to show the current output of the generator during training. It will help you determine how well the GAN is training.

In [17]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import numpy as np

def show_generator_output(sess, n_images, input_z, out_channel_dim, image_mode):
    """
    Show example output for the generator
    :param sess: TensorFlow session
    :param n_images: Number of Images to display
    :param input_z: Input Z Tensor
    :param out_channel_dim: The number of channels in the output image
    :param image_mode: The mode to use for images ("RGB" or "L")
    """
    cmap = None if image_mode == 'RGB' else 'gray'
    z_dim = input_z.get_shape().as_list()[-1]
    example_z = np.random.uniform(-1, 1, size=[n_images, z_dim])

    samples = sess.run(
        generator(input_z, out_channel_dim, False),
        feed_dict={input_z: example_z})

    images_grid = helper.images_square_grid(samples, image_mode)
    pyplot.imshow(images_grid, cmap=cmap)
    pyplot.show()
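helper.images_square_grid tiles the n_images samples into one square image for display. A minimal NumPy sketch of the same idea for grayscale images (images_square_grid_sketch is hypothetical; the real helper also rescales pixel values for display):

```python
import numpy as np

def images_square_grid_sketch(images):
    # images: (n, h, w) grayscale batch; assumes n is a perfect square.
    n, h, w = images.shape
    side = int(np.sqrt(n))
    grid = images[:side * side].reshape(side, side, h, w)
    # (side, side, h, w) -> (side*h, side*w): interleave rows and columns.
    return grid.transpose(0, 2, 1, 3).reshape(side * h, side * w)

batch = np.arange(4 * 2 * 2).reshape(4, 2, 2)
grid = images_square_grid_sketch(batch)  # a 4x4 image of four 2x2 tiles
```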

Train

Implement train to build and train the GAN. Use the following functions you implemented:

  • model_inputs(image_width, image_height, image_channels, z_dim)
  • model_loss(input_real, input_z, out_channel_dim)
  • model_opt(d_loss, g_loss, learning_rate, beta1)

Use show_generator_output to show generator output while you train. Running show_generator_output for every batch will drastically increase training time and the size of the notebook. It's recommended to print the generator output every 100 batches.

In [20]:
def train(epoch_count, batch_size, z_dim, learning_rate, beta1, get_batches, data_shape, data_image_mode):
    """
    Train the GAN
    :param epoch_count: Number of epochs
    :param batch_size: Batch Size
    :param z_dim: Z dimension
    :param learning_rate: Learning Rate
    :param beta1: The exponential decay rate for the 1st moment in the optimizer
    :param get_batches: Function to get batches
    :param data_shape: Shape of the data
    :param data_image_mode: The image mode to use for images ("RGB" or "L")
    """
    # TODO: Build Model
    _, image_width, image_height, image_channels = data_shape
    
    real_input, z_input, lr = model_inputs(image_width, image_height, image_channels, z_dim)
    
    d_loss, g_loss = model_loss(real_input, z_input, image_channels)
    d_train_opt, g_train_opt = model_opt(d_loss, g_loss, learning_rate, beta1)
    
    n_images = 25
    steps = 0
    print_interval = 10
    show_interval = 30
    losses = []
    
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch_i in range(epoch_count):
            for batch_images in get_batches(batch_size):
                # TODO: Train Model
                steps += 1
                batch_images *= 2.0  # scale from [-0.5, 0.5] to [-1, 1] to match the generator's tanh output
                z_sample = np.random.uniform(-1, 1, (batch_size, z_dim))
                
                _ = sess.run(d_train_opt, feed_dict={real_input: batch_images, z_input: z_sample, lr: learning_rate})
                # Run the generator optimizer twice per discriminator update
                # to keep the generator from falling behind.
                _ = sess.run(g_train_opt, feed_dict={z_input: z_sample, real_input: batch_images})
                _ = sess.run(g_train_opt, feed_dict={z_input: z_sample, real_input: batch_images})
                
                if steps % print_interval == 0:
                    train_loss_d = d_loss.eval({z_input: z_sample, real_input: batch_images})
                    train_loss_g = g_loss.eval({z_input: z_sample})
                    
                    print("Epoch {}/{}\t".format(epoch_i+1, epoch_count),
                          "Discriminator Loss: {:.4f}\t".format(train_loss_d),
                          "Generator Loss: {:.4f}".format(train_loss_g))
                    losses.append((train_loss_d, train_loss_g))
                
                if steps % show_interval == 0:
                    show_generator_output(sess, n_images, z_input, image_channels, data_image_mode)
                

MNIST

Test your GAN architecture on MNIST. After 2 epochs, the GAN should be able to generate images that look like handwritten digits. Make sure the generator's loss is lower than the discriminator's loss or close to 0.

In [21]:
batch_size = 64
z_dim = 100
learning_rate = 0.002
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 2

mnist_dataset = helper.Dataset('mnist', glob(os.path.join(data_dir, 'mnist/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, mnist_dataset.get_batches,
          mnist_dataset.shape, mnist_dataset.image_mode)
Epoch 1/2	 Discriminator Loss: 2.4219	 Generator Loss: 8.0443
Epoch 1/2	 Discriminator Loss: 1.7427	 Generator Loss: 2.5461
Epoch 1/2	 Discriminator Loss: 1.0912	 Generator Loss: 1.0808
Epoch 1/2	 Discriminator Loss: 0.5389	 Generator Loss: 2.4516
Epoch 1/2	 Discriminator Loss: 1.1407	 Generator Loss: 0.8731
Epoch 1/2	 Discriminator Loss: 1.1307	 Generator Loss: 0.8026
Epoch 1/2	 Discriminator Loss: 1.3832	 Generator Loss: 0.8704
Epoch 1/2	 Discriminator Loss: 1.7970	 Generator Loss: 0.3666
Epoch 1/2	 Discriminator Loss: 2.2337	 Generator Loss: 0.3148
Epoch 1/2	 Discriminator Loss: 1.2421	 Generator Loss: 0.7969
Epoch 1/2	 Discriminator Loss: 1.4911	 Generator Loss: 0.6467
Epoch 1/2	 Discriminator Loss: 1.4519	 Generator Loss: 0.5665
Epoch 1/2	 Discriminator Loss: 1.3739	 Generator Loss: 0.8382
Epoch 1/2	 Discriminator Loss: 1.0413	 Generator Loss: 0.9815
Epoch 1/2	 Discriminator Loss: 0.7674	 Generator Loss: 1.5204
Epoch 1/2	 Discriminator Loss: 1.4534	 Generator Loss: 0.6545
Epoch 1/2	 Discriminator Loss: 1.4927	 Generator Loss: 0.7513
Epoch 1/2	 Discriminator Loss: 1.3592	 Generator Loss: 1.2921
Epoch 1/2	 Discriminator Loss: 1.2665	 Generator Loss: 1.2166
Epoch 1/2	 Discriminator Loss: 1.5134	 Generator Loss: 1.8835
Epoch 1/2	 Discriminator Loss: 1.3668	 Generator Loss: 1.9324
Epoch 1/2	 Discriminator Loss: 1.1775	 Generator Loss: 1.0555
Epoch 1/2	 Discriminator Loss: 1.4031	 Generator Loss: 1.0703
Epoch 1/2	 Discriminator Loss: 1.4496	 Generator Loss: 0.8914
Epoch 1/2	 Discriminator Loss: 1.7789	 Generator Loss: 0.4365
Epoch 1/2	 Discriminator Loss: 1.4162	 Generator Loss: 1.2338
Epoch 1/2	 Discriminator Loss: 1.2375	 Generator Loss: 0.8177
Epoch 1/2	 Discriminator Loss: 1.4672	 Generator Loss: 0.5415
Epoch 1/2	 Discriminator Loss: 1.4750	 Generator Loss: 0.5790
Epoch 1/2	 Discriminator Loss: 1.4360	 Generator Loss: 0.7527
Epoch 1/2	 Discriminator Loss: 1.6226	 Generator Loss: 0.7980
Epoch 1/2	 Discriminator Loss: 1.2469	 Generator Loss: 1.0322
Epoch 1/2	 Discriminator Loss: 1.3750	 Generator Loss: 0.8564
Epoch 1/2	 Discriminator Loss: 1.3701	 Generator Loss: 0.9844
Epoch 1/2	 Discriminator Loss: 1.8572	 Generator Loss: 0.3705
Epoch 1/2	 Discriminator Loss: 1.2368	 Generator Loss: 1.0049
Epoch 1/2	 Discriminator Loss: 1.4004	 Generator Loss: 0.6589
Epoch 1/2	 Discriminator Loss: 1.5314	 Generator Loss: 0.4517
Epoch 1/2	 Discriminator Loss: 1.3118	 Generator Loss: 0.6910
Epoch 1/2	 Discriminator Loss: 1.4063	 Generator Loss: 0.7461
Epoch 1/2	 Discriminator Loss: 1.4916	 Generator Loss: 0.5280
Epoch 1/2	 Discriminator Loss: 1.4973	 Generator Loss: 1.0216
Epoch 1/2	 Discriminator Loss: 1.3441	 Generator Loss: 0.9813
Epoch 1/2	 Discriminator Loss: 1.4069	 Generator Loss: 0.6173
Epoch 1/2	 Discriminator Loss: 1.3058	 Generator Loss: 0.6188
Epoch 1/2	 Discriminator Loss: 1.4699	 Generator Loss: 0.7140
Epoch 1/2	 Discriminator Loss: 1.2328	 Generator Loss: 1.1807
Epoch 1/2	 Discriminator Loss: 1.4588	 Generator Loss: 0.5312
Epoch 1/2	 Discriminator Loss: 1.3076	 Generator Loss: 1.1668
Epoch 1/2	 Discriminator Loss: 1.4162	 Generator Loss: 0.5160
Epoch 1/2	 Discriminator Loss: 1.3890	 Generator Loss: 1.2047
Epoch 1/2	 Discriminator Loss: 1.6571	 Generator Loss: 0.4194
Epoch 1/2	 Discriminator Loss: 1.2283	 Generator Loss: 1.0452
Epoch 1/2	 Discriminator Loss: 1.3735	 Generator Loss: 1.6814
Epoch 1/2	 Discriminator Loss: 1.3180	 Generator Loss: 0.8914
Epoch 1/2	 Discriminator Loss: 1.3690	 Generator Loss: 1.1876
Epoch 1/2	 Discriminator Loss: 1.3560	 Generator Loss: 1.5037
Epoch 1/2	 Discriminator Loss: 1.5106	 Generator Loss: 0.4848
Epoch 1/2	 Discriminator Loss: 1.4911	 Generator Loss: 0.5093
Epoch 1/2	 Discriminator Loss: 1.4209	 Generator Loss: 1.3598
Epoch 1/2	 Discriminator Loss: 1.4128	 Generator Loss: 0.9288
Epoch 1/2	 Discriminator Loss: 1.5102	 Generator Loss: 1.1366
Epoch 1/2	 Discriminator Loss: 1.7773	 Generator Loss: 1.5921
Epoch 1/2	 Discriminator Loss: 1.3853	 Generator Loss: 0.6242
Epoch 1/2	 Discriminator Loss: 1.7283	 Generator Loss: 1.7780
Epoch 1/2	 Discriminator Loss: 1.3168	 Generator Loss: 0.7886
Epoch 1/2	 Discriminator Loss: 1.2765	 Generator Loss: 0.7895
Epoch 1/2	 Discriminator Loss: 1.3066	 Generator Loss: 0.8723
Epoch 1/2	 Discriminator Loss: 1.4863	 Generator Loss: 0.9436
Epoch 1/2	 Discriminator Loss: 1.6103	 Generator Loss: 1.3180
Epoch 1/2	 Discriminator Loss: 1.8492	 Generator Loss: 0.2920
Epoch 1/2	 Discriminator Loss: 1.4974	 Generator Loss: 0.8459
Epoch 1/2	 Discriminator Loss: 1.5853	 Generator Loss: 0.4083
Epoch 1/2	 Discriminator Loss: 1.3946	 Generator Loss: 0.9727
Epoch 1/2	 Discriminator Loss: 1.4724	 Generator Loss: 0.6430
Epoch 1/2	 Discriminator Loss: 1.4107	 Generator Loss: 0.5760
Epoch 1/2	 Discriminator Loss: 1.6046	 Generator Loss: 0.3989
Epoch 1/2	 Discriminator Loss: 1.7772	 Generator Loss: 0.2931
Epoch 1/2	 Discriminator Loss: 1.2397	 Generator Loss: 0.9424
Epoch 1/2	 Discriminator Loss: 1.4754	 Generator Loss: 0.6697
Epoch 1/2	 Discriminator Loss: 1.7219	 Generator Loss: 1.7435
Epoch 1/2	 Discriminator Loss: 1.4282	 Generator Loss: 0.6516
Epoch 1/2	 Discriminator Loss: 1.3675	 Generator Loss: 0.7057
Epoch 1/2	 Discriminator Loss: 1.4555	 Generator Loss: 1.1912
Epoch 1/2	 Discriminator Loss: 1.4028	 Generator Loss: 0.6351
Epoch 1/2	 Discriminator Loss: 1.8037	 Generator Loss: 0.2952
Epoch 1/2	 Discriminator Loss: 1.4442	 Generator Loss: 0.5213
Epoch 1/2	 Discriminator Loss: 1.3029	 Generator Loss: 0.7902
Epoch 1/2	 Discriminator Loss: 1.4009	 Generator Loss: 0.5816
Epoch 1/2	 Discriminator Loss: 1.5448	 Generator Loss: 0.5944
Epoch 1/2	 Discriminator Loss: 1.3470	 Generator Loss: 0.9406
Epoch 1/2	 Discriminator Loss: 1.3652	 Generator Loss: 0.7787
Epoch 1/2	 Discriminator Loss: 1.5198	 Generator Loss: 1.2339
Epoch 2/2	 Discriminator Loss: 1.3439	 Generator Loss: 0.8506
Epoch 2/2	 Discriminator Loss: 1.4362	 Generator Loss: 0.5486
Epoch 2/2	 Discriminator Loss: 1.4046	 Generator Loss: 0.5506
Epoch 2/2	 Discriminator Loss: 1.4431	 Generator Loss: 0.5584
Epoch 2/2	 Discriminator Loss: 1.3943	 Generator Loss: 0.7367
Epoch 2/2	 Discriminator Loss: 1.3490	 Generator Loss: 1.1063
Epoch 2/2	 Discriminator Loss: 1.4473	 Generator Loss: 0.7524
Epoch 2/2	 Discriminator Loss: 1.4010	 Generator Loss: 0.7705
Epoch 2/2	 Discriminator Loss: 1.4628	 Generator Loss: 0.4653
Epoch 2/2	 Discriminator Loss: 2.4473	 Generator Loss: 0.1598
Epoch 2/2	 Discriminator Loss: 1.4389	 Generator Loss: 0.5934
Epoch 2/2	 Discriminator Loss: 1.3791	 Generator Loss: 0.5843
Epoch 2/2	 Discriminator Loss: 1.4291	 Generator Loss: 0.9511
Epoch 2/2	 Discriminator Loss: 1.7134	 Generator Loss: 0.3631
Epoch 2/2	 Discriminator Loss: 1.4365	 Generator Loss: 0.5825
Epoch 2/2	 Discriminator Loss: 1.4316	 Generator Loss: 0.5966
Epoch 2/2	 Discriminator Loss: 1.3576	 Generator Loss: 0.7051
Epoch 2/2	 Discriminator Loss: 1.4079	 Generator Loss: 0.7365
Epoch 2/2	 Discriminator Loss: 1.4098	 Generator Loss: 0.6266
Epoch 2/2	 Discriminator Loss: 1.4814	 Generator Loss: 0.4393
Epoch 2/2	 Discriminator Loss: 1.3105	 Generator Loss: 0.6899
Epoch 2/2	 Discriminator Loss: 1.4272	 Generator Loss: 0.7409
Epoch 2/2	 Discriminator Loss: 1.3475	 Generator Loss: 0.7981
Epoch 2/2	 Discriminator Loss: 1.2829	 Generator Loss: 1.0391
Epoch 2/2	 Discriminator Loss: 1.4377	 Generator Loss: 0.5147
Epoch 2/2	 Discriminator Loss: 1.3079	 Generator Loss: 0.7685
Epoch 2/2	 Discriminator Loss: 1.4922	 Generator Loss: 0.5107
Epoch 2/2	 Discriminator Loss: 1.4109	 Generator Loss: 0.5856
Epoch 2/2	 Discriminator Loss: 1.3684	 Generator Loss: 0.9720
Epoch 2/2	 Discriminator Loss: 1.4107	 Generator Loss: 0.6141
Epoch 2/2	 Discriminator Loss: 1.4705	 Generator Loss: 0.5465
Epoch 2/2	 Discriminator Loss: 1.3457	 Generator Loss: 0.7793
Epoch 2/2	 Discriminator Loss: 1.8046	 Generator Loss: 0.3258
Epoch 2/2	 Discriminator Loss: 1.6080	 Generator Loss: 0.5186
Epoch 2/2	 Discriminator Loss: 2.0561	 Generator Loss: 0.2383
Epoch 2/2	 Discriminator Loss: 1.3964	 Generator Loss: 0.5682
Epoch 2/2	 Discriminator Loss: 1.5884	 Generator Loss: 0.8987
Epoch 2/2	 Discriminator Loss: 1.3805	 Generator Loss: 0.6260
Epoch 2/2	 Discriminator Loss: 1.9537	 Generator Loss: 1.7877
Epoch 2/2	 Discriminator Loss: 1.5612	 Generator Loss: 0.4094
Epoch 2/2	 Discriminator Loss: 1.6419	 Generator Loss: 0.3990
Epoch 2/2	 Discriminator Loss: 1.3218	 Generator Loss: 0.5733
Epoch 2/2	 Discriminator Loss: 1.3613	 Generator Loss: 0.7600
Epoch 2/2	 Discriminator Loss: 1.2616	 Generator Loss: 0.9167
Epoch 2/2	 Discriminator Loss: 1.5794	 Generator Loss: 0.7841
Epoch 2/2	 Discriminator Loss: 1.5681	 Generator Loss: 1.6409
Epoch 2/2	 Discriminator Loss: 1.5061	 Generator Loss: 0.7594
Epoch 2/2	 Discriminator Loss: 1.2284	 Generator Loss: 0.7495
Epoch 2/2	 Discriminator Loss: 1.4411	 Generator Loss: 0.5468
Epoch 2/2	 Discriminator Loss: 1.1937	 Generator Loss: 0.8714
Epoch 2/2	 Discriminator Loss: 1.4288	 Generator Loss: 0.6951
Epoch 2/2	 Discriminator Loss: 1.4027	 Generator Loss: 1.2204
Epoch 2/2	 Discriminator Loss: 1.4327	 Generator Loss: 0.5595
Epoch 2/2	 Discriminator Loss: 1.7468	 Generator Loss: 0.4037
Epoch 2/2	 Discriminator Loss: 1.1879	 Generator Loss: 0.8954
Epoch 2/2	 Discriminator Loss: 1.3939	 Generator Loss: 1.0422
Epoch 2/2	 Discriminator Loss: 1.4307	 Generator Loss: 0.8734
Epoch 2/2	 Discriminator Loss: 1.3896	 Generator Loss: 0.5361
Epoch 2/2	 Discriminator Loss: 2.5685	 Generator Loss: 0.1315
Epoch 2/2	 Discriminator Loss: 1.4375	 Generator Loss: 0.5305
Epoch 2/2	 Discriminator Loss: 1.7359	 Generator Loss: 0.3487
Epoch 2/2	 Discriminator Loss: 3.1453	 Generator Loss: 0.1210
Epoch 2/2	 Discriminator Loss: 1.4300	 Generator Loss: 0.5743
Epoch 2/2	 Discriminator Loss: 1.4393	 Generator Loss: 0.4956
Epoch 2/2	 Discriminator Loss: 1.2025	 Generator Loss: 1.2810
Epoch 2/2	 Discriminator Loss: 2.1295	 Generator Loss: 0.2240
Epoch 2/2	 Discriminator Loss: 0.9895	 Generator Loss: 1.3029
Epoch 2/2	 Discriminator Loss: 1.0768	 Generator Loss: 1.0075
Epoch 2/2	 Discriminator Loss: 1.6050	 Generator Loss: 1.9111
Epoch 2/2	 Discriminator Loss: 1.3093	 Generator Loss: 0.9237
Epoch 2/2	 Discriminator Loss: 1.3456	 Generator Loss: 0.6226
Epoch 2/2	 Discriminator Loss: 1.2731	 Generator Loss: 1.9421
Epoch 2/2	 Discriminator Loss: 2.0584	 Generator Loss: 0.2351
Epoch 2/2	 Discriminator Loss: 1.1422	 Generator Loss: 0.8149
Epoch 2/2	 Discriminator Loss: 1.4589	 Generator Loss: 0.6181
Epoch 2/2	 Discriminator Loss: 1.5267	 Generator Loss: 0.7028
Epoch 2/2	 Discriminator Loss: 1.6474	 Generator Loss: 0.4024
Epoch 2/2	 Discriminator Loss: 1.2884	 Generator Loss: 1.2382
Epoch 2/2	 Discriminator Loss: 2.2334	 Generator Loss: 0.2119
Epoch 2/2	 Discriminator Loss: 1.4792	 Generator Loss: 0.4705
Epoch 2/2	 Discriminator Loss: 1.7309	 Generator Loss: 0.3555
Epoch 2/2	 Discriminator Loss: 1.6671	 Generator Loss: 0.3862
Epoch 2/2	 Discriminator Loss: 2.1142	 Generator Loss: 0.2691
Epoch 2/2	 Discriminator Loss: 1.3056	 Generator Loss: 0.6030
Epoch 2/2	 Discriminator Loss: 1.7238	 Generator Loss: 0.3360
Epoch 2/2	 Discriminator Loss: 1.5073	 Generator Loss: 0.6982
Epoch 2/2	 Discriminator Loss: 1.3977	 Generator Loss: 0.5517
Epoch 2/2	 Discriminator Loss: 1.4552	 Generator Loss: 0.5745
Epoch 2/2	 Discriminator Loss: 1.3307	 Generator Loss: 2.1852
Epoch 2/2	 Discriminator Loss: 1.6309	 Generator Loss: 0.3874
Epoch 2/2	 Discriminator Loss: 1.6117	 Generator Loss: 0.3998
Epoch 2/2	 Discriminator Loss: 1.8012	 Generator Loss: 2.4621
Epoch 2/2	 Discriminator Loss: 1.1793	 Generator Loss: 0.9761
Epoch 2/2	 Discriminator Loss: 1.3199	 Generator Loss: 0.7900

CelebA

Run your GAN on CelebA. One epoch will take around 20 minutes on an average GPU. You can run the whole epoch or stop when it starts to generate realistic faces.

In [22]:
batch_size = 32
z_dim = 100
learning_rate = 0.002
beta1 = 0.5


"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
epochs = 1

celeba_dataset = helper.Dataset('celeba', glob(os.path.join(data_dir, 'img_align_celeba/*.jpg')))
with tf.Graph().as_default():
    train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
          celeba_dataset.shape, celeba_dataset.image_mode)
Epoch 1/1	 Discriminator Loss: 4.9938	 Generator Loss: 0.0212
Epoch 1/1	 Discriminator Loss: 2.2524	 Generator Loss: 0.9333
Epoch 1/1	 Discriminator Loss: 1.9616	 Generator Loss: 1.6128
Epoch 1/1	 Discriminator Loss: 1.7680	 Generator Loss: 1.4711
Epoch 1/1	 Discriminator Loss: 2.3017	 Generator Loss: 1.4792
Epoch 1/1	 Discriminator Loss: 1.5362	 Generator Loss: 0.6820
Epoch 1/1	 Discriminator Loss: 2.3747	 Generator Loss: 0.6959
Epoch 1/1	 Discriminator Loss: 1.5089	 Generator Loss: 0.6490
Epoch 1/1	 Discriminator Loss: 1.9161	 Generator Loss: 0.5503
Epoch 1/1	 Discriminator Loss: 1.7962	 Generator Loss: 0.6072
Epoch 1/1	 Discriminator Loss: 1.7056	 Generator Loss: 0.6892
Epoch 1/1	 Discriminator Loss: 1.7661	 Generator Loss: 0.4895
Epoch 1/1	 Discriminator Loss: 1.5819	 Generator Loss: 0.5958
Epoch 1/1	 Discriminator Loss: 1.5388	 Generator Loss: 0.7724
Epoch 1/1	 Discriminator Loss: 1.5308	 Generator Loss: 0.8600
Epoch 1/1	 Discriminator Loss: 1.5055	 Generator Loss: 0.6651
Epoch 1/1	 Discriminator Loss: 1.5336	 Generator Loss: 0.7294
Epoch 1/1	 Discriminator Loss: 1.6674	 Generator Loss: 0.7454
Epoch 1/1	 Discriminator Loss: 1.5211	 Generator Loss: 0.7525
Epoch 1/1	 Discriminator Loss: 1.6029	 Generator Loss: 0.6155
Epoch 1/1	 Discriminator Loss: 1.6724	 Generator Loss: 0.5642
Epoch 1/1	 Discriminator Loss: 1.6441	 Generator Loss: 0.5570
Epoch 1/1	 Discriminator Loss: 1.5412	 Generator Loss: 0.6526
Epoch 1/1	 Discriminator Loss: 1.4653	 Generator Loss: 0.6925
Epoch 1/1	 Discriminator Loss: 1.4509	 Generator Loss: 0.6006
Epoch 1/1	 Discriminator Loss: 1.5328	 Generator Loss: 0.8459
Epoch 1/1	 Discriminator Loss: 1.4821	 Generator Loss: 0.7388
Epoch 1/1	 Discriminator Loss: 1.5324	 Generator Loss: 0.7461
Epoch 1/1	 Discriminator Loss: 1.9325	 Generator Loss: 0.6623
Epoch 1/1	 Discriminator Loss: 1.5902	 Generator Loss: 1.0296
Epoch 1/1	 Discriminator Loss: 1.5336	 Generator Loss: 0.7161
Epoch 1/1	 Discriminator Loss: 1.7206	 Generator Loss: 0.5606
Epoch 1/1	 Discriminator Loss: 1.5090	 Generator Loss: 0.6450
Epoch 1/1	 Discriminator Loss: 1.4118	 Generator Loss: 0.8053
Epoch 1/1	 Discriminator Loss: 1.5491	 Generator Loss: 0.7687
Epoch 1/1	 Discriminator Loss: 1.4467	 Generator Loss: 0.8728
Epoch 1/1	 Discriminator Loss: 1.6107	 Generator Loss: 0.6270
Epoch 1/1	 Discriminator Loss: 1.5253	 Generator Loss: 0.7644
Epoch 1/1	 Discriminator Loss: 1.4368	 Generator Loss: 0.8802
Epoch 1/1	 Discriminator Loss: 1.4408	 Generator Loss: 0.6833
Epoch 1/1	 Discriminator Loss: 1.4752	 Generator Loss: 0.6736
Epoch 1/1	 Discriminator Loss: 1.3604	 Generator Loss: 1.0112
Epoch 1/1	 Discriminator Loss: 1.5194	 Generator Loss: 0.8068
Epoch 1/1	 Discriminator Loss: 1.3492	 Generator Loss: 0.7904
Epoch 1/1	 Discriminator Loss: 1.5064	 Generator Loss: 0.6477
Epoch 1/1	 Discriminator Loss: 1.4917	 Generator Loss: 0.8771
Epoch 1/1	 Discriminator Loss: 1.5463	 Generator Loss: 0.7750
Epoch 1/1	 Discriminator Loss: 1.4954	 Generator Loss: 0.7188
Epoch 1/1	 Discriminator Loss: 1.4457	 Generator Loss: 0.9244
Epoch 1/1	 Discriminator Loss: 1.4903	 Generator Loss: 0.6672
Epoch 1/1	 Discriminator Loss: 1.5296	 Generator Loss: 0.8105
Epoch 1/1	 Discriminator Loss: 1.6772	 Generator Loss: 0.7886
Epoch 1/1	 Discriminator Loss: 1.4728	 Generator Loss: 0.9148
Epoch 1/1	 Discriminator Loss: 1.5589	 Generator Loss: 0.5998
Epoch 1/1	 Discriminator Loss: 1.7137	 Generator Loss: 0.5091
Epoch 1/1	 Discriminator Loss: 1.5635	 Generator Loss: 0.5753
Epoch 1/1	 Discriminator Loss: 1.5977	 Generator Loss: 0.6694
Epoch 1/1	 Discriminator Loss: 1.5360	 Generator Loss: 0.6460
Epoch 1/1	 Discriminator Loss: 1.4809	 Generator Loss: 0.7786
Epoch 1/1	 Discriminator Loss: 1.4257	 Generator Loss: 0.8358
Epoch 1/1	 Discriminator Loss: 1.3790	 Generator Loss: 0.7098
Epoch 1/1	 Discriminator Loss: 1.5555	 Generator Loss: 0.7294
Epoch 1/1	 Discriminator Loss: 1.5157	 Generator Loss: 0.7038
Epoch 1/1	 Discriminator Loss: 1.5990	 Generator Loss: 0.5108
Epoch 1/1	 Discriminator Loss: 1.4563	 Generator Loss: 0.7646
Epoch 1/1	 Discriminator Loss: 1.4952	 Generator Loss: 1.0132
Epoch 1/1	 Discriminator Loss: 1.5787	 Generator Loss: 0.5970
Epoch 1/1	 Discriminator Loss: 1.4362	 Generator Loss: 0.6583
Epoch 1/1	 Discriminator Loss: 1.5141	 Generator Loss: 1.0257
Epoch 1/1	 Discriminator Loss: 1.4775	 Generator Loss: 0.6648
Epoch 1/1	 Discriminator Loss: 1.4294	 Generator Loss: 0.7620
Epoch 1/1	 Discriminator Loss: 1.4720	 Generator Loss: 0.7062
Epoch 1/1	 Discriminator Loss: 1.4360	 Generator Loss: 0.7682
Epoch 1/1	 Discriminator Loss: 1.5191	 Generator Loss: 0.7087
Epoch 1/1	 Discriminator Loss: 1.3913	 Generator Loss: 0.7508
Epoch 1/1	 Discriminator Loss: 1.5073	 Generator Loss: 0.7635
Epoch 1/1	 Discriminator Loss: 1.4919	 Generator Loss: 0.8708
Epoch 1/1	 Discriminator Loss: 1.4409	 Generator Loss: 0.7208
Epoch 1/1	 Discriminator Loss: 1.4101	 Generator Loss: 0.7337
Epoch 1/1	 Discriminator Loss: 1.8699	 Generator Loss: 0.4695
Epoch 1/1	 Discriminator Loss: 1.7797	 Generator Loss: 0.7971
Epoch 1/1	 Discriminator Loss: 1.5426	 Generator Loss: 0.5744
Epoch 1/1	 Discriminator Loss: 1.5124	 Generator Loss: 0.5708
Epoch 1/1	 Discriminator Loss: 1.5594	 Generator Loss: 0.6161
Epoch 1/1	 Discriminator Loss: 1.3559	 Generator Loss: 0.8338
Epoch 1/1	 Discriminator Loss: 1.4347	 Generator Loss: 0.7166
Epoch 1/1	 Discriminator Loss: 1.5771	 Generator Loss: 0.6129
Epoch 1/1	 Discriminator Loss: 1.5073	 Generator Loss: 0.7024
Epoch 1/1	 Discriminator Loss: 1.5041	 Generator Loss: 0.7147
Epoch 1/1	 Discriminator Loss: 1.4610	 Generator Loss: 0.6829
Epoch 1/1	 Discriminator Loss: 1.4475	 Generator Loss: 0.8848
Epoch 1/1	 Discriminator Loss: 1.4928	 Generator Loss: 0.7435
Epoch 1/1	 Discriminator Loss: 1.4611	 Generator Loss: 0.9190
Epoch 1/1	 Discriminator Loss: 1.5144	 Generator Loss: 1.0383
Epoch 1/1	 Discriminator Loss: 1.3814	 Generator Loss: 0.6970
Epoch 1/1	 Discriminator Loss: 1.3341	 Generator Loss: 0.7500
Epoch 1/1	 Discriminator Loss: 1.5109	 Generator Loss: 0.7767
Epoch 1/1	 Discriminator Loss: 1.3906	 Generator Loss: 0.6792
Epoch 1/1	 Discriminator Loss: 1.4756	 Generator Loss: 0.5318
Epoch 1/1	 Discriminator Loss: 1.3886	 Generator Loss: 0.8618
Epoch 1/1	 Discriminator Loss: 1.3870	 Generator Loss: 1.1000
Epoch 1/1	 Discriminator Loss: 1.4342	 Generator Loss: 0.8283
Epoch 1/1	 Discriminator Loss: 1.4017	 Generator Loss: 0.7957
Epoch 1/1	 Discriminator Loss: 1.5024	 Generator Loss: 0.7409
Epoch 1/1	 Discriminator Loss: 1.3932	 Generator Loss: 0.8889
Epoch 1/1	 Discriminator Loss: 1.3965	 Generator Loss: 0.9803
Epoch 1/1	 Discriminator Loss: 1.4399	 Generator Loss: 0.8259
Epoch 1/1	 Discriminator Loss: 1.5051	 Generator Loss: 0.6887
Epoch 1/1	 Discriminator Loss: 1.6133	 Generator Loss: 0.5902
Epoch 1/1	 Discriminator Loss: 1.4363	 Generator Loss: 0.8869
Epoch 1/1	 Discriminator Loss: 1.4100	 Generator Loss: 0.7490
Epoch 1/1	 Discriminator Loss: 1.3678	 Generator Loss: 0.9759
Epoch 1/1	 Discriminator Loss: 1.3872	 Generator Loss: 0.8661
Epoch 1/1	 Discriminator Loss: 1.4055	 Generator Loss: 0.8813
Epoch 1/1	 Discriminator Loss: 1.5849	 Generator Loss: 0.6609
Epoch 1/1	 Discriminator Loss: 1.4670	 Generator Loss: 0.6997
Epoch 1/1	 Discriminator Loss: 1.4191	 Generator Loss: 0.8879
Epoch 1/1	 Discriminator Loss: 1.3588	 Generator Loss: 0.7477
Epoch 1/1	 Discriminator Loss: 1.3878	 Generator Loss: 0.6520
Epoch 1/1	 Discriminator Loss: 1.3716	 Generator Loss: 0.7545
Epoch 1/1	 Discriminator Loss: 1.4617	 Generator Loss: 0.6718
Epoch 1/1	 Discriminator Loss: 1.3576	 Generator Loss: 0.8901
Epoch 1/1	 Discriminator Loss: 1.3989	 Generator Loss: 0.7856
Epoch 1/1	 Discriminator Loss: 1.4861	 Generator Loss: 0.7896
Epoch 1/1	 Discriminator Loss: 1.4526	 Generator Loss: 0.8606
Epoch 1/1	 Discriminator Loss: 1.4312	 Generator Loss: 0.6807
Epoch 1/1	 Discriminator Loss: 1.3653	 Generator Loss: 1.0297
Epoch 1/1	 Discriminator Loss: 1.4994	 Generator Loss: 0.9676
Epoch 1/1	 Discriminator Loss: 1.3360	 Generator Loss: 0.7336
Epoch 1/1	 Discriminator Loss: 1.3707	 Generator Loss: 1.1442
Epoch 1/1	 Discriminator Loss: 1.5458	 Generator Loss: 1.0120
Epoch 1/1	 Discriminator Loss: 1.4408	 Generator Loss: 0.7625
Epoch 1/1	 Discriminator Loss: 1.4653	 Generator Loss: 0.8971
Epoch 1/1	 Discriminator Loss: 1.4017	 Generator Loss: 0.8502
Epoch 1/1	 Discriminator Loss: 1.4515	 Generator Loss: 0.8753
Epoch 1/1	 Discriminator Loss: 1.4029	 Generator Loss: 0.7063
Epoch 1/1	 Discriminator Loss: 1.4973	 Generator Loss: 1.2245
Epoch 1/1	 Discriminator Loss: 1.3584	 Generator Loss: 0.8821
Epoch 1/1	 Discriminator Loss: 1.4260	 Generator Loss: 0.7490
Epoch 1/1	 Discriminator Loss: 1.4633	 Generator Loss: 0.7435
Epoch 1/1	 Discriminator Loss: 1.4442	 Generator Loss: 0.6918
Epoch 1/1	 Discriminator Loss: 1.4138	 Generator Loss: 0.7583
Epoch 1/1	 Discriminator Loss: 1.4276	 Generator Loss: 0.7259
Epoch 1/1	 Discriminator Loss: 1.4333	 Generator Loss: 0.7330
Epoch 1/1	 Discriminator Loss: 1.5780	 Generator Loss: 0.6169
Epoch 1/1	 Discriminator Loss: 1.4186	 Generator Loss: 0.6276
Epoch 1/1	 Discriminator Loss: 1.5309	 Generator Loss: 0.6732
Epoch 1/1	 Discriminator Loss: 1.5359	 Generator Loss: 0.5365
Epoch 1/1	 Discriminator Loss: 1.4171	 Generator Loss: 0.6826
Epoch 1/1	 Discriminator Loss: 1.5471	 Generator Loss: 0.8601
Epoch 1/1	 Discriminator Loss: 1.3785	 Generator Loss: 0.6466
Epoch 1/1	 Discriminator Loss: 1.3813	 Generator Loss: 0.8717
Epoch 1/1	 Discriminator Loss: 1.3851	 Generator Loss: 0.6904
Epoch 1/1	 Discriminator Loss: 1.5281	 Generator Loss: 0.6154
Epoch 1/1	 Discriminator Loss: 1.3820	 Generator Loss: 0.8195
Epoch 1/1	 Discriminator Loss: 1.4027	 Generator Loss: 0.8097
Epoch 1/1	 Discriminator Loss: 1.3846	 Generator Loss: 0.6618
Epoch 1/1	 Discriminator Loss: 1.4235	 Generator Loss: 0.7218
Epoch 1/1	 Discriminator Loss: 1.3421	 Generator Loss: 0.9240
Epoch 1/1	 Discriminator Loss: 1.4283	 Generator Loss: 0.8613
Epoch 1/1	 Discriminator Loss: 1.4808	 Generator Loss: 0.8761
Epoch 1/1	 Discriminator Loss: 1.4777	 Generator Loss: 0.7213
Epoch 1/1	 Discriminator Loss: 1.4126	 Generator Loss: 0.7654
Epoch 1/1	 Discriminator Loss: 1.4484	 Generator Loss: 0.7843
Epoch 1/1	 Discriminator Loss: 1.4266	 Generator Loss: 0.8314
Epoch 1/1	 Discriminator Loss: 1.3853	 Generator Loss: 0.7066
Epoch 1/1	 Discriminator Loss: 1.4708	 Generator Loss: 0.8803
Epoch 1/1	 Discriminator Loss: 1.4543	 Generator Loss: 0.6850
Epoch 1/1	 Discriminator Loss: 1.4418	 Generator Loss: 0.7853
Epoch 1/1	 Discriminator Loss: 1.3968	 Generator Loss: 0.8888
Epoch 1/1	 Discriminator Loss: 1.4038	 Generator Loss: 0.7417
Epoch 1/1	 Discriminator Loss: 1.3803	 Generator Loss: 0.8705
Epoch 1/1	 Discriminator Loss: 1.3875	 Generator Loss: 0.6830
Epoch 1/1	 Discriminator Loss: 1.3890	 Generator Loss: 0.9431
Epoch 1/1	 Discriminator Loss: 1.3874	 Generator Loss: 0.8158
Epoch 1/1	 Discriminator Loss: 1.3562	 Generator Loss: 0.9146
Epoch 1/1	 Discriminator Loss: 1.4172	 Generator Loss: 0.9656
Epoch 1/1	 Discriminator Loss: 1.5187	 Generator Loss: 0.6537
Epoch 1/1	 Discriminator Loss: 1.4332	 Generator Loss: 0.9065
Epoch 1/1	 Discriminator Loss: 1.3596	 Generator Loss: 0.6992
Epoch 1/1	 Discriminator Loss: 1.4664	 Generator Loss: 0.7114
Epoch 1/1	 Discriminator Loss: 1.3576	 Generator Loss: 0.7523
Epoch 1/1	 Discriminator Loss: 1.4127	 Generator Loss: 0.7813
Epoch 1/1	 Discriminator Loss: 1.4404	 Generator Loss: 0.7795
Epoch 1/1	 Discriminator Loss: 1.4685	 Generator Loss: 0.6942
Epoch 1/1	 Discriminator Loss: 1.4418	 Generator Loss: 0.7991
Epoch 1/1	 Discriminator Loss: 1.4018	 Generator Loss: 0.7342
Epoch 1/1	 Discriminator Loss: 1.3777	 Generator Loss: 0.9086
Epoch 1/1	 Discriminator Loss: 1.3923	 Generator Loss: 0.8275
Epoch 1/1	 Discriminator Loss: 1.3945	 Generator Loss: 0.7042
Epoch 1/1	 Discriminator Loss: 1.4225	 Generator Loss: 0.7528
Epoch 1/1	 Discriminator Loss: 1.4511	 Generator Loss: 0.7917
Epoch 1/1	 Discriminator Loss: 1.5128	 Generator Loss: 0.5588
Epoch 1/1	 Discriminator Loss: 1.4066	 Generator Loss: 0.6819
Epoch 1/1	 Discriminator Loss: 1.3979	 Generator Loss: 0.6943
Epoch 1/1	 Discriminator Loss: 1.4971	 Generator Loss: 0.6842
Epoch 1/1	 Discriminator Loss: 1.3642	 Generator Loss: 0.7392
Epoch 1/1	 Discriminator Loss: 1.3750	 Generator Loss: 0.6598
Epoch 1/1	 Discriminator Loss: 1.3422	 Generator Loss: 0.8035
Epoch 1/1	 Discriminator Loss: 1.4554	 Generator Loss: 0.6032
Epoch 1/1	 Discriminator Loss: 1.4391	 Generator Loss: 0.6685
Epoch 1/1	 Discriminator Loss: 1.4223	 Generator Loss: 0.7423
Epoch 1/1	 Discriminator Loss: 1.3715	 Generator Loss: 0.7540
Epoch 1/1	 Discriminator Loss: 1.4165	 Generator Loss: 0.7284
Epoch 1/1	 Discriminator Loss: 1.5184	 Generator Loss: 0.6224
Epoch 1/1	 Discriminator Loss: 1.4302	 Generator Loss: 0.8100
Epoch 1/1	 Discriminator Loss: 1.3719	 Generator Loss: 0.8149
Epoch 1/1	 Discriminator Loss: 1.4195	 Generator Loss: 0.7628
Epoch 1/1	 Discriminator Loss: 1.4801	 Generator Loss: 0.7693
Epoch 1/1	 Discriminator Loss: 1.4577	 Generator Loss: 0.7163
Epoch 1/1	 Discriminator Loss: 1.4466	 Generator Loss: 0.7330
Epoch 1/1	 Discriminator Loss: 1.3659	 Generator Loss: 0.8053
Epoch 1/1	 Discriminator Loss: 1.3964	 Generator Loss: 0.7615
Epoch 1/1	 Discriminator Loss: 1.3762	 Generator Loss: 0.7725
Epoch 1/1	 Discriminator Loss: 1.4651	 Generator Loss: 0.6457
Epoch 1/1	 Discriminator Loss: 1.3834	 Generator Loss: 0.7078
Epoch 1/1	 Discriminator Loss: 1.3814	 Generator Loss: 0.8083
Epoch 1/1	 Discriminator Loss: 1.3998	 Generator Loss: 0.7348
Epoch 1/1	 Discriminator Loss: 1.5458	 Generator Loss: 0.5367
Epoch 1/1	 Discriminator Loss: 1.3810	 Generator Loss: 0.8022
Epoch 1/1	 Discriminator Loss: 1.5207	 Generator Loss: 0.6506
Epoch 1/1	 Discriminator Loss: 1.3775	 Generator Loss: 0.7717
Epoch 1/1	 Discriminator Loss: 1.4106	 Generator Loss: 0.7152
Epoch 1/1	 Discriminator Loss: 1.4026	 Generator Loss: 0.8493
Epoch 1/1	 Discriminator Loss: 1.4198	 Generator Loss: 0.7052
Epoch 1/1	 Discriminator Loss: 1.4137	 Generator Loss: 0.8435
Epoch 1/1	 Discriminator Loss: 1.3675	 Generator Loss: 0.9091
Epoch 1/1	 Discriminator Loss: 1.3909	 Generator Loss: 0.7783
Epoch 1/1	 Discriminator Loss: 1.3616	 Generator Loss: 0.9575
Epoch 1/1	 Discriminator Loss: 1.4643	 Generator Loss: 0.6318
Epoch 1/1	 Discriminator Loss: 1.3920	 Generator Loss: 0.7345
Epoch 1/1	 Discriminator Loss: 1.4614	 Generator Loss: 0.6992
Epoch 1/1	 Discriminator Loss: 1.4649	 Generator Loss: 0.6747
Epoch 1/1	 Discriminator Loss: 1.3758	 Generator Loss: 0.7952
Epoch 1/1	 Discriminator Loss: 1.4029	 Generator Loss: 0.7095
Epoch 1/1	 Discriminator Loss: 1.4081	 Generator Loss: 0.8645
Epoch 1/1	 Discriminator Loss: 1.4275	 Generator Loss: 0.7505
Epoch 1/1	 Discriminator Loss: 1.3836	 Generator Loss: 0.7635
Epoch 1/1	 Discriminator Loss: 1.3529	 Generator Loss: 0.8037
Epoch 1/1	 Discriminator Loss: 1.4014	 Generator Loss: 0.7778
Epoch 1/1	 Discriminator Loss: 1.4077	 Generator Loss: 0.7392
Epoch 1/1	 Discriminator Loss: 1.4106	 Generator Loss: 0.6952
Epoch 1/1	 Discriminator Loss: 1.4040	 Generator Loss: 0.7235
Epoch 1/1	 Discriminator Loss: 1.3918	 Generator Loss: 0.7724
Epoch 1/1	 Discriminator Loss: 1.3876	 Generator Loss: 0.7701
Epoch 1/1	 Discriminator Loss: 1.3661	 Generator Loss: 0.8346
Epoch 1/1	 Discriminator Loss: 1.3710	 Generator Loss: 0.7696
Epoch 1/1	 Discriminator Loss: 1.3880	 Generator Loss: 0.7471
Epoch 1/1	 Discriminator Loss: 1.3803	 Generator Loss: 0.8213
Epoch 1/1	 Discriminator Loss: 1.4453	 Generator Loss: 0.7818
Epoch 1/1	 Discriminator Loss: 1.3667	 Generator Loss: 0.7505
Epoch 1/1	 Discriminator Loss: 1.3463	 Generator Loss: 0.8079
Epoch 1/1	 Discriminator Loss: 1.4252	 Generator Loss: 0.7416
Epoch 1/1	 Discriminator Loss: 1.4186	 Generator Loss: 0.8334
Epoch 1/1	 Discriminator Loss: 1.3815	 Generator Loss: 0.7542
Epoch 1/1	 Discriminator Loss: 1.4008	 Generator Loss: 0.7359
Epoch 1/1	 Discriminator Loss: 1.3561	 Generator Loss: 0.8096
Epoch 1/1	 Discriminator Loss: 1.3344	 Generator Loss: 0.7719
Epoch 1/1	 Discriminator Loss: 1.4070	 Generator Loss: 0.8056
Epoch 1/1	 Discriminator Loss: 1.3710	 Generator Loss: 0.8063
Epoch 1/1	 Discriminator Loss: 1.4943	 Generator Loss: 0.7521
Epoch 1/1	 Discriminator Loss: 1.3819	 Generator Loss: 0.8571
Epoch 1/1	 Discriminator Loss: 1.4075	 Generator Loss: 0.7732
Epoch 1/1	 Discriminator Loss: 1.3801	 Generator Loss: 0.8566
Epoch 1/1	 Discriminator Loss: 1.4066	 Generator Loss: 0.7730
Epoch 1/1	 Discriminator Loss: 1.3920	 Generator Loss: 0.7668
Epoch 1/1	 Discriminator Loss: 1.4078	 Generator Loss: 0.7325
Epoch 1/1	 Discriminator Loss: 1.4960	 Generator Loss: 0.6916
Epoch 1/1	 Discriminator Loss: 1.4414	 Generator Loss: 0.6930
Epoch 1/1	 Discriminator Loss: 1.3212	 Generator Loss: 0.8098
Epoch 1/1	 Discriminator Loss: 1.3682	 Generator Loss: 0.7959
Epoch 1/1	 Discriminator Loss: 1.4031	 Generator Loss: 0.7153
Epoch 1/1	 Discriminator Loss: 1.4123	 Generator Loss: 0.8792
Epoch 1/1	 Discriminator Loss: 1.4334	 Generator Loss: 0.8509
Epoch 1/1	 Discriminator Loss: 1.3964	 Generator Loss: 0.7760
Epoch 1/1	 Discriminator Loss: 1.4269	 Generator Loss: 0.6484
Epoch 1/1	 Discriminator Loss: 1.4053	 Generator Loss: 0.7953
Epoch 1/1	 Discriminator Loss: 1.3407	 Generator Loss: 0.7693
Epoch 1/1	 Discriminator Loss: 1.4000	 Generator Loss: 0.7895
Epoch 1/1	 Discriminator Loss: 1.4424	 Generator Loss: 0.7438
Epoch 1/1	 Discriminator Loss: 1.3836	 Generator Loss: 0.9048
Epoch 1/1	 Discriminator Loss: 1.3515	 Generator Loss: 0.7346
Epoch 1/1	 Discriminator Loss: 1.3921	 Generator Loss: 0.7977
Epoch 1/1	 Discriminator Loss: 1.3692	 Generator Loss: 0.8696
Epoch 1/1	 Discriminator Loss: 1.4607	 Generator Loss: 0.7798
Epoch 1/1	 Discriminator Loss: 1.3992	 Generator Loss: 0.7738
Epoch 1/1	 Discriminator Loss: 1.4170	 Generator Loss: 0.8463
Epoch 1/1	 Discriminator Loss: 1.3951	 Generator Loss: 0.7302
Epoch 1/1	 Discriminator Loss: 1.3647	 Generator Loss: 0.8800
Epoch 1/1	 Discriminator Loss: 1.3973	 Generator Loss: 0.7919
Epoch 1/1	 Discriminator Loss: 1.3463	 Generator Loss: 0.8315
Epoch 1/1	 Discriminator Loss: 1.3910	 Generator Loss: 0.8147
Epoch 1/1	 Discriminator Loss: 1.3824	 Generator Loss: 0.8235
Epoch 1/1	 Discriminator Loss: 1.3810	 Generator Loss: 0.7827
Epoch 1/1	 Discriminator Loss: 1.4313	 Generator Loss: 0.7809
Epoch 1/1	 Discriminator Loss: 1.3850	 Generator Loss: 0.7840
Epoch 1/1	 Discriminator Loss: 1.3969	 Generator Loss: 0.7247
Epoch 1/1	 Discriminator Loss: 1.3484	 Generator Loss: 0.8657
Epoch 1/1	 Discriminator Loss: 1.4223	 Generator Loss: 0.8219
Epoch 1/1	 Discriminator Loss: 1.4359	 Generator Loss: 0.8228
Epoch 1/1	 Discriminator Loss: 1.3791	 Generator Loss: 0.7556
Epoch 1/1	 Discriminator Loss: 1.3944	 Generator Loss: 0.7947
Epoch 1/1	 Discriminator Loss: 1.3859	 Generator Loss: 0.7801
Epoch 1/1	 Discriminator Loss: 1.3607	 Generator Loss: 0.7419
Epoch 1/1	 Discriminator Loss: 1.4060	 Generator Loss: 0.7603
Epoch 1/1	 Discriminator Loss: 1.3719	 Generator Loss: 0.6978
Epoch 1/1	 Discriminator Loss: 1.3956	 Generator Loss: 0.7481
Epoch 1/1	 Discriminator Loss: 1.3758	 Generator Loss: 0.8582
Epoch 1/1	 Discriminator Loss: 1.4296	 Generator Loss: 0.7595
Epoch 1/1	 Discriminator Loss: 1.4036	 Generator Loss: 0.8435
Epoch 1/1	 Discriminator Loss: 1.4197	 Generator Loss: 0.8007
Epoch 1/1	 Discriminator Loss: 1.3648	 Generator Loss: 0.7224
Epoch 1/1	 Discriminator Loss: 1.4148	 Generator Loss: 0.8308
Epoch 1/1	 Discriminator Loss: 1.3941	 Generator Loss: 0.6989
Epoch 1/1	 Discriminator Loss: 1.4219	 Generator Loss: 0.8398
Epoch 1/1	 Discriminator Loss: 1.4453	 Generator Loss: 0.7595
Epoch 1/1	 Discriminator Loss: 1.3560	 Generator Loss: 0.7949
Epoch 1/1	 Discriminator Loss: 1.3844	 Generator Loss: 0.8210
Epoch 1/1	 Discriminator Loss: 1.3978	 Generator Loss: 0.8072
Epoch 1/1	 Discriminator Loss: 1.4057	 Generator Loss: 0.7520
Epoch 1/1	 Discriminator Loss: 1.3922	 Generator Loss: 0.7628
Epoch 1/1	 Discriminator Loss: 1.3913	 Generator Loss: 0.7626
Epoch 1/1	 Discriminator Loss: 1.3762	 Generator Loss: 0.7405
Epoch 1/1	 Discriminator Loss: 1.3917	 Generator Loss: 0.7311
Epoch 1/1	 Discriminator Loss: 1.4063	 Generator Loss: 0.7834
Epoch 1/1	 Discriminator Loss: 1.3965	 Generator Loss: 0.8065
Epoch 1/1	 Discriminator Loss: 1.4247	 Generator Loss: 0.7845
Epoch 1/1	 Discriminator Loss: 1.3848	 Generator Loss: 0.7638
Epoch 1/1	 Discriminator Loss: 1.3712	 Generator Loss: 0.8223
Epoch 1/1	 Discriminator Loss: 1.3644	 Generator Loss: 0.7477
Epoch 1/1	 Discriminator Loss: 1.3961	 Generator Loss: 0.8687
Epoch 1/1	 Discriminator Loss: 1.3750	 Generator Loss: 0.7695
Epoch 1/1	 Discriminator Loss: 1.3903	 Generator Loss: 0.8121
Epoch 1/1	 Discriminator Loss: 1.3741	 Generator Loss: 0.8069
Epoch 1/1	 Discriminator Loss: 1.3989	 Generator Loss: 0.7781
Epoch 1/1	 Discriminator Loss: 1.3767	 Generator Loss: 0.8494
Epoch 1/1	 Discriminator Loss: 1.4022	 Generator Loss: 0.7562
Epoch 1/1	 Discriminator Loss: 1.4030	 Generator Loss: 0.7202
Epoch 1/1	 Discriminator Loss: 1.4098	 Generator Loss: 0.7957
Epoch 1/1	 Discriminator Loss: 1.3991	 Generator Loss: 0.7811
Epoch 1/1	 Discriminator Loss: 1.3840	 Generator Loss: 0.7339
Epoch 1/1	 Discriminator Loss: 1.3640	 Generator Loss: 0.7230
Epoch 1/1	 Discriminator Loss: 1.3808	 Generator Loss: 0.8188
Epoch 1/1	 Discriminator Loss: 1.4139	 Generator Loss: 0.7735
Epoch 1/1	 Discriminator Loss: 1.3714	 Generator Loss: 0.7875
Epoch 1/1	 Discriminator Loss: 1.4140	 Generator Loss: 0.7901
Epoch 1/1	 Discriminator Loss: 1.3565	 Generator Loss: 0.7675
Epoch 1/1	 Discriminator Loss: 1.4340	 Generator Loss: 0.7740
Epoch 1/1	 Discriminator Loss: 1.3958	 Generator Loss: 0.7745
Epoch 1/1	 Discriminator Loss: 1.4282	 Generator Loss: 0.7569
Epoch 1/1	 Discriminator Loss: 1.4413	 Generator Loss: 0.7242
Epoch 1/1	 Discriminator Loss: 1.3634	 Generator Loss: 0.8059
Epoch 1/1	 Discriminator Loss: 1.4041	 Generator Loss: 0.7636
Epoch 1/1	 Discriminator Loss: 1.4325	 Generator Loss: 0.7801
Epoch 1/1	 Discriminator Loss: 1.4605	 Generator Loss: 0.8183
Epoch 1/1	 Discriminator Loss: 1.4449	 Generator Loss: 0.8001
Epoch 1/1	 Discriminator Loss: 1.3805	 Generator Loss: 0.7662
Epoch 1/1	 Discriminator Loss: 1.3843	 Generator Loss: 0.7557
Epoch 1/1	 Discriminator Loss: 1.3957	 Generator Loss: 0.7958
Epoch 1/1	 Discriminator Loss: 1.3984	 Generator Loss: 0.7533
Epoch 1/1	 Discriminator Loss: 1.3648	 Generator Loss: 0.7911
Epoch 1/1	 Discriminator Loss: 1.4200	 Generator Loss: 0.7902
Epoch 1/1	 Discriminator Loss: 1.4389	 Generator Loss: 0.7440
Epoch 1/1	 Discriminator Loss: 1.4321	 Generator Loss: 0.7658
Epoch 1/1	 Discriminator Loss: 1.3892	 Generator Loss: 0.7893
Epoch 1/1	 Discriminator Loss: 1.3809	 Generator Loss: 0.8186
Epoch 1/1	 Discriminator Loss: 1.4077	 Generator Loss: 0.7531
Epoch 1/1	 Discriminator Loss: 1.4330	 Generator Loss: 0.7010
Epoch 1/1	 Discriminator Loss: 1.4054	 Generator Loss: 0.7697
Epoch 1/1	 Discriminator Loss: 1.3940	 Generator Loss: 0.7450
Epoch 1/1	 Discriminator Loss: 1.3992	 Generator Loss: 0.8345
Epoch 1/1	 Discriminator Loss: 1.3809	 Generator Loss: 0.7847
Epoch 1/1	 Discriminator Loss: 1.4019	 Generator Loss: 0.7637
Epoch 1/1	 Discriminator Loss: 1.3906	 Generator Loss: 0.7867
Epoch 1/1	 Discriminator Loss: 1.3843	 Generator Loss: 0.7977
Epoch 1/1	 Discriminator Loss: 1.3862	 Generator Loss: 0.7995
Epoch 1/1	 Discriminator Loss: 1.4160	 Generator Loss: 0.7013
Epoch 1/1	 Discriminator Loss: 1.3711	 Generator Loss: 0.7535
Epoch 1/1	 Discriminator Loss: 1.3738	 Generator Loss: 0.8115
Epoch 1/1	 Discriminator Loss: 1.3775	 Generator Loss: 0.7742
Epoch 1/1	 Discriminator Loss: 1.4209	 Generator Loss: 0.8159
Epoch 1/1	 Discriminator Loss: 1.3927	 Generator Loss: 0.7786
Epoch 1/1	 Discriminator Loss: 1.3889	 Generator Loss: 0.8089
Epoch 1/1	 Discriminator Loss: 1.4213	 Generator Loss: 0.7831
Epoch 1/1	 Discriminator Loss: 1.3942	 Generator Loss: 0.7749
Epoch 1/1	 Discriminator Loss: 1.4032	 Generator Loss: 0.7596
Epoch 1/1	 Discriminator Loss: 1.4035	 Generator Loss: 0.7909
Epoch 1/1	 Discriminator Loss: 1.3968	 Generator Loss: 0.8100
Epoch 1/1	 Discriminator Loss: 1.4111	 Generator Loss: 0.7133
Epoch 1/1	 Discriminator Loss: 1.3973	 Generator Loss: 0.7972
Epoch 1/1	 Discriminator Loss: 1.3896	 Generator Loss: 0.8085
Epoch 1/1	 Discriminator Loss: 1.3944	 Generator Loss: 0.7719
Epoch 1/1	 Discriminator Loss: 1.4278	 Generator Loss: 0.7411
Epoch 1/1	 Discriminator Loss: 1.3702	 Generator Loss: 0.8011
Epoch 1/1	 Discriminator Loss: 1.3844	 Generator Loss: 0.7477
Epoch 1/1	 Discriminator Loss: 1.3812	 Generator Loss: 0.8277
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-22-5c7456fd1d62> in <module>()
     13 with tf.Graph().as_default():
     14     train(epochs, batch_size, z_dim, learning_rate, beta1, celeba_dataset.get_batches,
---> 15           celeba_dataset.shape, celeba_dataset.image_mode)

<ipython-input-20-5b718edb587f> in train(epoch_count, batch_size, z_dim, learning_rate, beta1, get_batches, data_shape, data_image_mode)
     38 #                 _ = sess.run(g_train_opt, feed_dict={z_input: z_sample, lr: learning_rate})
     39                 _ = sess.run(g_train_opt, feed_dict={z_input: z_sample, real_input: batch_images})
---> 40                 _ = sess.run(g_train_opt, feed_dict={z_input: z_sample, real_input: batch_images})
     41 
     42                 if steps % print_interval == 0:

~/anaconda2/envs/deep2/lib/python3.6/site-packages/tensorflow/python/client/session.py in run(self, fetches, feed_dict, options, run_metadata)
    776     try:
    777       result = self._run(None, fetches, feed_dict, options_ptr,
--> 778                          run_metadata_ptr)
    779       if run_metadata:
    780         proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

~/anaconda2/envs/deep2/lib/python3.6/site-packages/tensorflow/python/client/session.py in _run(self, handle, fetches, feed_dict, options, run_metadata)
    980     if final_fetches or final_targets:
    981       results = self._do_run(handle, final_targets, final_fetches,
--> 982                              feed_dict_string, options, run_metadata)
    983     else:
    984       results = []

~/anaconda2/envs/deep2/lib/python3.6/site-packages/tensorflow/python/client/session.py in _do_run(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)
   1030     if handle is None:
   1031       return self._do_call(_run_fn, self._session, feed_dict, fetch_list,
-> 1032                            target_list, options, run_metadata)
   1033     else:
   1034       return self._do_call(_prun_fn, self._session, handle, feed_dict,

~/anaconda2/envs/deep2/lib/python3.6/site-packages/tensorflow/python/client/session.py in _do_call(self, fn, *args)
   1037   def _do_call(self, fn, *args):
   1038     try:
-> 1039       return fn(*args)
   1040     except errors.OpError as e:
   1041       message = compat.as_text(e.message)

~/anaconda2/envs/deep2/lib/python3.6/site-packages/tensorflow/python/client/session.py in _run_fn(session, feed_dict, fetch_list, target_list, options, run_metadata)
   1019         return tf_session.TF_Run(session, options,
   1020                                  feed_dict, fetch_list, target_list,
-> 1021                                  status, run_metadata)
   1022 
   1023     def _prun_fn(session, handle, feed_dict, fetch_list):

KeyboardInterrupt: 
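A detail worth noting in the interrupted training cell above: the generator optimizer is run twice for each discriminator update (the two consecutive `sess.run(g_train_opt, ...)` calls), a common trick to keep the generator from being overpowered by the discriminator. A minimal, framework-free sketch of that loop structure — `d_step` and `g_step` here are hypothetical stand-ins for the real TensorFlow optimizer ops, not the project's actual API:

```python
def train_loop(batches, print_interval=10,
               d_step=lambda batch: None, g_step=lambda batch: None):
    """Run one epoch; return (d_updates, g_updates, steps_printed)."""
    d_updates = g_updates = 0
    printed = []
    for steps, batch in enumerate(batches, start=1):
        d_step(batch)      # one discriminator update per batch
        d_updates += 1
        g_step(batch)      # generator updated twice per batch,
        g_step(batch)      # mirroring the two sess.run(g_train_opt) calls
        g_updates += 2
        if steps % print_interval == 0:   # log losses periodically
            printed.append(steps)
    return d_updates, g_updates, printed

d, g, printed = train_loop(range(25), print_interval=10)
# 25 discriminator updates, 50 generator updates, logging at steps 10 and 20
```

Whether the extra generator step helps depends on the architecture and learning rates; it is one of several ways to rebalance the two losses, alongside label smoothing or a lower discriminator learning rate.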

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook as "dlnd_face_generation.ipynb", and also export it as an HTML file via "File" -> "Download as". Include the "helper.py" and "problem_unittests.py" files in your submission.